Incremental Composition in Distributional Semantics
Authors
Abstract
Despite the incremental nature of Dynamic Syntax (DS), its semantic grounding remains that of predicate logic, itself grounded in set theory, and so is poorly suited to expressing rampantly context-relative word meaning and related phenomena such as the judgements of similarity needed for modelling disambiguation. Here, we show how DS can be assigned a compositional distributional semantics which makes it possible to incrementally disambiguate language constructs using vector space semantics. Building on a proposal from our previous work, we implement and evaluate the model on real data, showing that it outperforms the commonly used additive baseline. In conclusion, we argue that these results ground an account of the non-determinism of lexical content, with word meaning depending on the surrounding context for its construal.
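The abstract mentions incremental disambiguation with vector space semantics and an additive baseline. As a minimal illustrative sketch (not the paper's actual model), the Python snippet below accumulates an additive context vector word by word and, at each increment, ranks the candidate senses of an upcoming ambiguous word against it; all vectors, dimensions and sense labels are hypothetical.

import numpy as np

# Hypothetical 4-dimensional vectors; the dimensions might stand for
# contexts such as (finance, water, sport, food). Illustrative only.
vectors = {
    "the":   np.array([0.1, 0.1, 0.1, 0.1]),
    "steep": np.array([0.2, 1.0, 0.5, 0.0]),
    "river": np.array([0.0, 3.0, 0.2, 0.1]),
}
# Two hypothetical sense vectors for the ambiguous word "bank".
senses = {
    "bank_finance": np.array([4.0, 0.1, 0.0, 0.2]),
    "bank_river":   np.array([0.1, 3.5, 0.1, 0.0]),
}

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Additive baseline: the context vector is the running sum of word vectors.
context = np.zeros(4)
for word in ["the", "steep", "river"]:
    context += vectors[word]
    # At each increment, rank the senses of the ambiguous word
    # against the context built so far.
    ranking = sorted(senses, key=lambda s: cos(senses[s], context), reverse=True)
    print(word, "->", ranking[0])
# As the context grows, the ranking shifts toward the river sense of "bank".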
Similar Articles
Denoising composition in distributional semantics
Since in distributional semantics we derive the embedding from a corpus, and the corpus is just a sample from the entire distribution, it is inevitable that the vectors obtained in the process will be somewhat noisy. In Section 2 we analyze this and other sources of noise, and in Section 3 we turn to the question of how much the considerations of compositionality discussed in Kornai and Kracht ...
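As a toy illustration of the sampling noise described in this snippet, the sketch below builds co-occurrence vectors for the same target word from two halves of a made-up corpus and measures how far the two estimates diverge; the corpus, target word and window size are assumptions for illustration only.

import numpy as np
from collections import Counter

# A tiny made-up corpus, split into two samples of the "same" distribution.
sample_a = "the cat sat on the mat the cat chased the mouse".split()
sample_b = "the cat slept on the mat the dog chased the cat".split()
vocab = sorted(set(sample_a) | set(sample_b))

def cooc_vector(tokens, target, window=2):
    # Count context words within +/- window of each occurrence of target.
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    counts[tokens[j]] += 1
    return np.array([counts[w] for w in vocab], dtype=float)

v_a = cooc_vector(sample_a, "cat")
v_b = cooc_vector(sample_b, "cat")
cosine = v_a @ v_b / (np.linalg.norm(v_a) * np.linalg.norm(v_b))
# The cosine is below 1: two samples of the same underlying distribution
# give noticeably different vectors for the same word.
print(round(float(cosine), 3))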
DISSECT - DIStributional SEmantics Composition Toolkit
We introduce DISSECT, a toolkit to build and explore computational models of word, phrase and sentence meaning based on the principles of distributional semantics. The toolkit focuses in particular on compositional meaning, and implements a number of composition methods that have been proposed in the literature. Furthermore, DISSECT can be useful to researchers and practitioners who need models...
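The composition methods referred to here include (weighted) additive and pointwise multiplicative models from the literature. The sketch below shows these in plain numpy rather than through DISSECT's own API, which is not reproduced here; the vectors and weights are hypothetical.

import numpy as np

def additive(u, v):
    # Phrase vector as the sum of the word vectors.
    return u + v

def weighted_additive(u, v, alpha=0.6, beta=0.4):
    # Weighted sum; alpha and beta are illustrative settings.
    return alpha * u + beta * v

def multiplicative(u, v):
    # Pointwise (Hadamard) product of the word vectors.
    return u * v

red = np.array([0.2, 1.5, 0.3])
car = np.array([1.0, 0.4, 2.0])
for name, f in [("add", additive), ("wadd", weighted_additive), ("mult", multiplicative)]:
    print(name, f(red, car))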
Separating Disambiguation from Composition in Distributional Semantics
Most compositional-distributional models of meaning are based on ambiguous vector representations, where all the senses of a word are fused into the same vector. This paper provides evidence that the addition of a vector disambiguation step prior to the actual composition would be beneficial to the whole process, producing better composite representations. Furthermore, we relate this issue with...
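A minimal sketch, under assumed sense inventories and vectors, of the idea described in this snippet: instead of composing a single fused vector for an ambiguous word, first select the sense closest to the other word in the phrase, then compose.

import numpy as np

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical sense vectors for the ambiguous verb "file".
senses_file = {
    "file_submit": np.array([2.0, 0.1, 0.3]),
    "file_smooth": np.array([0.2, 1.8, 0.1]),
}
# A fused (ambiguous) vector, here simply the average of the sense vectors.
fused_file = np.mean(list(senses_file.values()), axis=0)
nails = np.array([0.1, 2.0, 0.4])

# Disambiguation step: pick the sense most similar to the argument vector.
best_sense = max(senses_file, key=lambda s: cos(senses_file[s], nails))

# Composition step (additive, for illustration) on the disambiguated vector
# versus on the fused vector.
composed_disamb = senses_file[best_sense] + nails
composed_fused = fused_file + nails
print(best_sense)
print(composed_disamb, composed_fused)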
Composition in Distributional Models of Semantics
Vector-based models of word meaning have become increasingly popular in cognitive science. The appeal of these models lies in their ability to represent meaning simply by using distributional information under the assumption that words occurring within similar contexts are semantically similar. Despite their widespread use, vector-based models are typically directed at representing words in iso...
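To make the distributional assumption mentioned in this snippet concrete, the sketch below compares words by the cosine of made-up context-count vectors: words occurring in similar contexts come out as similar, while distributionally different words do not.

import numpy as np

# Made-up counts over the same set of context words (drink, pour, cup, engine, road).
contexts = ["drink", "pour", "cup", "engine", "road"]
coffee = np.array([10.0, 6.0, 9.0, 0.0, 1.0])
tea    = np.array([9.0, 7.0, 8.0, 0.0, 0.0])
car    = np.array([0.0, 0.0, 1.0, 8.0, 9.0])

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# "coffee" and "tea" share contexts and get a high cosine; "car" does not.
print(round(cos(coffee, tea), 3), round(cos(coffee, car), 3))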
Journal
Journal title: Journal of Logic, Language and Information
Year: 2021
ISSN: 1572-9583, 0925-8531
DOI: https://doi.org/10.1007/s10849-021-09337-8